On Unique Decodability, McMillan’s Theorem and the Expected Length of Codes

Author

  • Riccardo Leonardi
Abstract

In this paper we revisit the topic of unique decodability and some of the related fundamental theorems. It is widely believed that, for any discrete source X, every “uniquely decodable” block code satisfies E[l(X1, X2, ..., Xn)] ≥ H(X1, X2, ..., Xn), where X1, X2, ..., Xn are the first n symbols of the source, E[l(X1, X2, ..., Xn)] is the expected length of the code for those symbols, and H(X1, X2, ..., Xn) is their joint entropy. We show that, for certain sources with memory, the above inequality holds only if a restrictive definition of “uniquely decodable code” is adopted. In particular, the inequality is usually assumed to hold for any “practical code” on the basis of a debatable application of McMillan’s theorem to sources with memory. We therefore propose a clarification of the topic, also providing extended versions of McMillan’s theorem and of the Sardinas-Patterson test suited to Markovian sources. The paper also closes with the following interesting remark: both McMillan’s original theorem and ours are equivalent to Shannon’s theorem on the capacity of noiseless channels.
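
Since the abstract refers to the Sardinas-Patterson test, a minimal Python sketch of the classical test for finite codes may help fix ideas. It is not taken from the paper, and the extended Markovian version proposed in the paper is not reproduced here; the function names and the string representation of codewords are illustrative assumptions.

    def dangling_suffixes(prefixes, words):
        """Nonempty suffixes w such that some p in `prefixes` is a proper prefix
        of some word in `words`, i.e. word = p + w."""
        out = set()
        for p in prefixes:
            for w in words:
                if w.startswith(p) and len(w) > len(p):
                    out.add(w[len(p):])
        return out

    def is_uniquely_decodable(codewords):
        """Classical Sardinas-Patterson test for a finite code given as strings."""
        code = set(codewords)
        current = dangling_suffixes(code, code)   # S_1: suffixes left over from pairs of codewords
        seen = set()
        while current:
            if current & code:                    # a dangling suffix equals a codeword -> ambiguity
                return False
            frozen = frozenset(current)
            if frozen in seen:                    # the suffix sets repeat -> no ambiguity can ever arise
                return True
            seen.add(frozen)
            # S_{i+1}: match dangling suffixes against codewords in both directions
            current = dangling_suffixes(code, current) | dangling_suffixes(current, code)
        return True

    # "010" parses as 0|10 and 01|0, so the first code is not uniquely decodable.
    assert not is_uniquely_decodable(["0", "01", "10"])
    assert is_uniquely_decodable(["0", "01", "11"])

Every set of dangling suffixes is a subset of the finitely many suffixes of codewords, so the iteration either reaches a codeword, revealing an ambiguous concatenation, or starts repeating, certifying unique decodability.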

Similar Articles

Complexity and sliding-block decodability

A constrained system, or sofic system, S is the set of symbol strings generated by the finite-length paths through a finite labeled, directed graph. Karabed and Marcus, extending the work of Adler, Coppersmith, and Hassner, used the technique of state-splitting to prove the existence of a noncatastrophic, rate p : q finite-state encoder from binary data into S for any input word length p and co...
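
For concreteness, membership of a word in such a constrained system can be checked with a few lines of code. The sketch below is not from the cited work; the two-state graph presenting the “no two consecutive 1s” constraint and the function name are assumptions made for illustration.

    def generated_by(graph, start_states, word):
        """True iff `word` labels some path in the labeled directed graph.
        `graph` maps state -> {symbol: next_state} (a deterministic presentation)."""
        states = set(start_states)
        for sym in word:
            states = {graph[s][sym] for s in states if sym in graph[s]}
            if not states:
                return False
        return True

    # Two-state presentation of the constraint "no two consecutive 1s".
    no_11 = {"a": {"0": "a", "1": "b"}, "b": {"0": "a"}}
    print(generated_by(no_11, {"a", "b"}, "010010"))  # True
    print(generated_by(no_11, {"a", "b"}, "0110"))    # False: contains "11"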

Information theory—homework exercises

1 Entropy, source coding. Problem 1 (Alternative definition of unique decodability): A code f : X → Y∗ is called uniquely decodable if for any messages u = u1 · · · uk and v = v1 · · · vk (where u1, v1, . . . , uk, vk ∈ X) with f(u1)f(u2) · · · f(uk) = f(v1)f(v2) · · · f(vk), we have ui = vi for all i. That is, as opposed to the definition given in class, we require that the codes of any pair of...
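
A brute-force check of this equal-length-message variant of unique decodability can be sketched as follows; the code mapping and function name are illustrative, and the search is exponential in k, so it only makes sense for tiny examples.

    from itertools import product

    def equal_length_counterexample(code, k):
        """Search for two distinct messages u != v of length k whose encodings
        f(u1)...f(uk) and f(v1)...f(vk) coincide; `code` maps source symbols to
        codeword strings.  Returns such a pair, or None if none exists."""
        symbols = list(code)
        encodings = {}
        for u in product(symbols, repeat=k):
            enc = "".join(code[s] for s in u)
            if enc in encodings and encodings[enc] != u:
                return encodings[enc], u
            encodings[enc] = u
        return None

    # {"a": "0", "b": "01", "c": "10"} encodes both "ac" and "ba" as "010".
    print(equal_length_counterexample({"a": "0", "b": "01", "c": "10"}, 2))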

Unique decodability of bigram counts by finite automata

We revisit the problem of deciding whether a given string is uniquely decodable from its bigram counts by means of a finite automaton. An efficient algorithm for constructing a polynomial-size nondeterministic finite automaton that decides unique decodability is given. Conversely, we show that the minimum deterministic finite automaton for deciding unique decodability has at least exponentially...
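
The automaton construction is beyond a short snippet, but the question itself is easy to state in code. The brute-force sketch below is illustrative only: the function names and binary alphabet are assumptions, and the exact convention for bigram counts (e.g. start/end markers) may differ from the paper's.

    from collections import Counter
    from itertools import product

    def bigram_counts(s):
        """Multiset of adjacent symbol pairs of the string s."""
        return Counter(zip(s, s[1:]))

    def decodable_from_bigrams(s, alphabet="01"):
        """Brute-force check (exponential, tiny examples only) that no other string
        of the same length over `alphabet` has the same bigram counts as s."""
        target = bigram_counts(s)
        for cand in ("".join(t) for t in product(alphabet, repeat=len(s))):
            if cand != s and bigram_counts(cand) == target:
                return False
        return True

    print(decodable_from_bigrams("000"))  # True: no other length-3 string has two "00" bigrams
    print(decodable_from_bigrams("010"))  # False: "101" has the same bigram counts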

On the Unique Decodability of Insertion-Correcting Codes Beyond the Guarantee

Unlike the space of received words generated by substitution errors, the space of received words generated by insertion errors is infinite. Given an arbitrary code, it is possible for there to exist an infinite number of received words that are unique to a particular codeword. This work explores the extent to which an arbitrary insertion-correcting code can take advantage of this fact. Such que...
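
As a small illustration of why some received words nevertheless identify a single codeword, the sketch below (not from the paper; the toy code and helper names are assumptions) enumerates the words reachable from each codeword by exactly t insertions and keeps those compatible with only one codeword.

    from itertools import combinations, product

    def insertions(word, t, alphabet="01"):
        """All strings obtainable from `word` by inserting exactly t symbols."""
        out, n = set(), len(word)
        for slots in combinations(range(n + t), t):       # positions of the inserted symbols
            for fill in product(alphabet, repeat=t):
                src, ins = iter(word), iter(fill)
                out.add("".join(next(ins) if i in slots else next(src) for i in range(n + t)))
        return out

    code = ["00", "11"]
    balls = {c: insertions(c, 1) for c in code}
    for c in code:
        others = set().union(*(b for d, b in balls.items() if d != c))
        print(c, sorted(balls[c] - others))   # received words unique to this codeword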

Combinatorial bounds for list decoding

Informally, an error-correcting code has “nice” list-decodability properties if every Hamming ball of “large” radius contains a “small” number of codewords. Here, we report linear codes with non-trivial list-decodability: i.e., codes of large rate that are nicely list-decodable, and codes of large distance that are not nicely list-decodable. Specifically, on the positive side, we show that there...
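
The quantity at the heart of these statements, the number of codewords inside a Hamming ball, can be computed directly for small codes. A minimal sketch follows; the function names and toy example are illustrative, not taken from the cited work.

    def hamming_distance(x, y):
        """Number of positions where the equal-length strings x and y differ."""
        return sum(a != b for a, b in zip(x, y))

    def list_size(code, center, radius):
        """Number of codewords within Hamming distance `radius` of `center`;
        a code is 'nicely' list-decodable when this stays small for every center."""
        return sum(1 for c in code if hamming_distance(c, center) <= radius)

    # Length-3 repetition code around the received word "010".
    print(list_size(["000", "111"], "010", 1))   # 1: only "000" is within distance 1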

Publication date: 2008